77 research outputs found

    Exploring the neural entrainment to musical rhythms and meter : a steady-state evoked potential approach

    Doctoral thesis completed in cotutelle (joint supervision) with the Université catholique de Louvain, Belgium (Faculty of Medicine, Institute of Neuroscience). The ability to perceive a regular beat in music and synchronize to it is a widespread human skill, fundamental to musical behaviour. Beat and meter refer to the perception of periodicities while listening to musical rhythms, and usually involve spontaneous entrainment to move on these periodicities. However, the neural mechanisms underlying entrainment to beat and meter in humans remain unclear. The present work tests a novel experimental approach, inspired by the steady-state evoked potential method, to explore the neural dynamics supporting the perception of rhythmic inputs. Using human electroencephalography (EEG), neural responses to beat and meter were recorded in various contexts: (1) mental imagery of a meter applied endogenously to an auditory stimulus, (2) spontaneous induction of a beat from rhythmic patterns, (3) multisensory integration, and (4) sensorimotor synchronization. Our results support the view that entrainment and resonance phenomena subtend the processing of musical rhythms in the human brain. Furthermore, they suggest that this novel approach could help investigate the link between the phenomenology of musical beat and meter and neurophysiological evidence of a bias towards periodicities arising under certain circumstances in the nervous system. Hence, entrainment to music provides an original framework to explore general entrainment phenomena occurring at various levels, from the inter-neural to the inter-individual level.

    Lateralised dynamic modulations of corticomuscular coherence associated with bimanual learning of rhythmic patterns

    Supplementary Information: The online version contains supplementary material available at https://doi.org/10.1038/s41598-022-10342-5. Human movements are spontaneously attracted to auditory rhythms, triggering an automatic activation of the motor system, a phenomenon central to music perception and production. Cortico-muscular coherence (CMC) in the theta, alpha, beta and gamma frequency bands has been used as an index of the synchronisation between cortical motor regions and the muscles. Here we investigated how learning to produce a bimanual rhythmic pattern composed of low- and high-pitch sounds affects CMC in the beta frequency band. Electroencephalography (EEG) and electromyography (EMG) from the left and right First Dorsal Interosseous and Flexor Digitorum Superficialis muscles were recorded concurrently while participants maintained constant pressure on a force sensor held between the thumb and index finger and listened to the rhythmic pattern, before and after a bimanual training session. During the training, participants learnt to produce the rhythmic pattern, guided by visual cues, by pressing the force sensors with their left or right hand to produce the low- and high-pitch sounds, respectively. Results revealed no changes after training in overall beta CMC or beta oscillation amplitude, nor in the correlation between the left and right sides for EEG and EMG separately. However, correlation analyses indicated that left- and right-hand beta EEG–EMG coherence were positively correlated over time before training but became uncorrelated after training. This suggests that learning to bimanually produce a rhythmic musical pattern reinforces lateralised and segregated cortico-muscular communication. This work was supported by a grant from the Australian Research Council (DP170104322).

    Spatial and temporal (non)binding of audiovisual rhythms in sensorimotor synchronisation

    All data are held in a public repository, available at the OSF database (URL access: https://osf.io/2jr48/?view_only=17e3f6f57651418c980832e00d818072). Human movement synchronisation with moving objects strongly relies on visual input. However, auditory information also plays an important role, since real environments are intrinsically multimodal. We used electroencephalography (EEG) frequency tagging to investigate the selective neural processing and integration of visual and auditory information during motor tracking, and tested the effects of spatial and temporal congruency between audiovisual modalities. EEG was recorded while participants tracked with their index finger a red dot flickering at fV = 15 Hz and oscillating horizontally on a screen. The simultaneous auditory stimulus was modulated in pitch at fA = 32 Hz and lateralised between the left and right audio channels to induce the perception of a periodic displacement of the sound source. Audiovisual congruency was manipulated in terms of space in Experiment 1 (no motion, same direction or opposite direction), and timing in Experiment 2 (no delay, medium delay or large delay). For both experiments, significant EEG responses were elicited at the fV and fA tagging frequencies. It was also hypothesised that intermodulation products at frequencies fV ± fA, corresponding to the nonlinear integration of the visual and auditory stimuli, would be elicited due to audiovisual integration, especially in congruent conditions. However, these components were not observed. Moreover, synchronisation and EEG results were not influenced by the congruency manipulations, which invites further exploration of the conditions that may modulate audiovisual processing and the motor tracking of moving objects. We thank Ashleigh Clibborn and Ayah Hammoud for their assistance with data collection. This work was supported by a grant from the Australian Research Council (DP170104322, DP220103047).
    OML is supported by the Portuguese Foundation for Science and Technology and the Portuguese Ministry of Science, Technology and Higher Education, through national funds, within the scope of the Transitory Disposition of Decree No. 57/2016, of 29 August, amended by Law No. 57/2017 of 19 July (Ref.: SFRH/BPD/72710/2010).
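    The frequency-tagging analysis described in this abstract can be sketched in a few lines: extract the amplitude spectrum of the EEG and read off the bins at the tagging frequencies and at the intermodulation products fV ± fA. The sketch below uses a synthetic signal with invented sampling rate, duration, and amplitudes (not the study's recording parameters) and assumes NumPy is available.

```python
import numpy as np

def tagged_amplitudes(signal, fs, freqs):
    """Single-sided amplitude spectrum values at given frequencies (nearest FFT bin)."""
    n = len(signal)
    amp = 2 * np.abs(np.fft.rfft(signal)) / n
    bins = np.fft.rfftfreq(n, d=1 / fs)
    return {f: amp[np.argmin(np.abs(bins - f))] for f in freqs}

fs, dur = 256, 10                 # illustrative sampling rate (Hz) and duration (s)
t = np.arange(0, dur, 1 / fs)
rng = np.random.default_rng(0)

fV, fA = 15, 32                   # visual and auditory tagging frequencies (Hz)
# Synthetic "EEG": responses at both tagging frequencies buried in broadband noise.
eeg = (np.sin(2 * np.pi * fV * t)
       + 0.8 * np.sin(2 * np.pi * fA * t)
       + rng.normal(0, 2, t.size))

# Amplitudes at the tagged frequencies and at the intermodulation
# products fA - fV = 17 Hz and fA + fV = 47 Hz.
amps = tagged_amplitudes(eeg, fs, [fV, fA, fA - fV, fA + fV])
```

    In this toy signal, as in the study's results, energy is present at the tagged frequencies but not at the intermodulation frequencies; a real intermodulation response would require a nonlinear interaction between the two inputs.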

    Neural tracking of the musical beat is enhanced by low-frequency sounds

    Music makes us move, and using bass instruments to build the rhythmic foundations of music is especially effective at inducing people to dance to periodic pulse-like beats. Here, we show that this culturally widespread practice may exploit a neurophysiological mechanism whereby low-frequency sounds shape the neural representations of rhythmic input by boosting selective locking to the beat. Cortical activity was captured using electroencephalography (EEG) while participants listened to a regular rhythm or to a relatively complex syncopated rhythm conveyed either by low tones (130 Hz) or high tones (1236.8 Hz). We found that cortical activity at the frequency of the perceived beat is selectively enhanced compared with other frequencies in the EEG spectrum when rhythms are conveyed by bass sounds. This effect is unlikely to arise from early cochlear processes, as revealed by auditory physiological modeling, and was particularly pronounced for the complex rhythm requiring endogenous generation of the beat. The effect is likewise not attributable to differences in perceived loudness between low and high tones, as a control experiment manipulating sound intensity alone did not yield similar results. Finally, the privileged role of bass sounds is contingent on allocation of attentional resources to the temporal properties of the stimulus, as revealed by a further control experiment examining the role of a behavioral task. Together, our results provide a neurobiological basis for the convention of using bass instruments to carry the rhythmic foundations of music and to drive people to move to the beat.

    Spatial and temporal (non)binding of audio-visual stimuli: effects on motor tracking and underlying neural sensory processing

    Objectives: Compare the steady-state evoked potentials (SSEPs) elicited by spatially or temporally congruent and incongruent audio-visual stimuli, and evaluate how congruency affects the motion tracking of visual stimuli. Research question: Does spatial or temporal congruency of audio-visual stimuli affect motion tracking and evoke differential SSEPs? Methods: We used EEG frequency-tagging techniques to investigate the selective neural processing and integration of visual and auditory information in the tracking of a moving stimulus, and how spatial and temporal (in)congruency between the two modalities modulate these sensory neural processes and synchronisation performance. Participants were instructed to track, by moving their index finger, a red dot flickering at 15 Hz that oscillated horizontally with a complex trajectory on a computer screen. An auditory pure tone with continuous pitch modulation at 32 Hz was presented with lateralised amplitude modulations in the left and right audio channels (panning) that were, in Experiment 1, either spatially congruent or incongruent (same direction vs. opposite direction vs. no panning), and in Experiment 2, either temporally congruent or incongruent (no delay vs. medium or large delay), with the oscillating visual stimulus. Results: Both experiments yielded significant EEG responses at the visual (15 Hz) and auditory (32 Hz) tagging frequencies. Further, in Experiment 1, participants had lower performance and larger amplitudes at the auditory frequency in the no-panning condition. No significant correlation between the two measures was found. In Experiment 2, no changes in the amplitude of the EEG responses or in performance were found. Conclusion: Movement synchronisation performance and the neural processing of visual and auditory information were not influenced by the phase congruency manipulation.
    For spatial congruency, the moving auditory stimuli led to better performance, irrespective of congruency, when compared to the non-moving sound. Importantly, there were no significant responses at 17 and 47 Hz, corresponding to the intermodulation frequencies of 15 and 32 Hz, suggesting an absence of global integration of visual and auditory information. These results encourage further exploration of the conditions that may result in the selective processing of visual and auditory information and their integration in the motor tracking of moving environmental objects.

    Neural tracking and integration of 'self' and 'other' in improvised interpersonal coordination

    Humans coordinate their movements with one another in a range of everyday activities and skill domains. Optimal joint performance requires the continuous anticipation of and adaptation to each other's movements, especially when actions are spontaneous rather than pre-planned. Here we employ dual-EEG and frequency-tagging techniques to investigate how the neural tracking of self- and other-generated movements supports interpersonal coordination during improvised motion. LEDs flickering at 5.7 and 7.7 Hz were attached to participants’ index fingers in 28 dyads as they produced novel patterns of synchronous horizontal forearm movements. EEG responses at these frequencies revealed enhanced neural tracking of self-generated movement when leading and of other-generated movements when following. A marker of self-other integration at 13.4 Hz (the intermodulation frequency of 5.7 and 7.7 Hz) peaked when no leader was designated, and mutual adaptation and movement synchrony were maximal. Furthermore, the amplitude of EEG responses reflected differences in the capacity of dyads to synchronize their movements, offering a neurophysiologically grounded perspective for understanding perceptual-motor mechanisms underlying joint action. © 2019 Elsevier Inc.

    Dynamic modulation of beta band cortico-muscular coupling induced by audio-visual rhythms

    Human movements often spontaneously fall into synchrony with auditory and visual environmental rhythms. Related behavioral studies have shown that motor responses are automatically and unintentionally coupled with external rhythmic stimuli. However, the neurophysiological processes underlying such motor entrainment remain largely unknown. Here we investigated with electroencephalography (EEG) and electromyography (EMG) the modulation of neural and muscular activity induced by periodic audio and/or visual sequences. The sequences were presented at either 1 Hz or 2 Hz while participants maintained constant finger pressure on a force sensor. The results revealed that although there was no change of amplitude in participants’ EMG in response to the sequences, the synchronization between EMG and EEG recorded over motor areas in the beta (12–40 Hz) frequency band was dynamically modulated, with maximal coherence occurring about 100 ms before each stimulus. These modulations in beta EEG–EMG motor coherence were found for the 2 Hz audio-visual sequences, confirming at a neurophysiological level the enhancement of motor entrainment with multimodal rhythms that fall within preferred perceptual and movement frequency ranges. Our findings identify beta band cortico-muscular coupling as a potential underlying mechanism of motor entrainment, further elucidating the nature of the link between sensory and motor systems in humans.
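    Beta-band EEG–EMG coherence of the kind measured in this study can be illustrated with magnitude-squared coherence between two signals that share a common beta-frequency component. The sketch below uses entirely synthetic signals and made-up parameters (a shared 20 Hz drive standing in for cortico-muscular coupling) and assumes NumPy and SciPy are available; it is not the study's analysis pipeline.

```python
import numpy as np
from scipy.signal import coherence

fs = 1000                               # illustrative sampling rate (Hz)
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(1)

# A shared 20 Hz (beta band) drive stands in for cortico-muscular coupling.
drive = np.sin(2 * np.pi * 20 * t)
eeg = drive + rng.normal(0, 1, t.size)  # synthetic "cortical" signal
emg = drive + rng.normal(0, 1, t.size)  # synthetic "muscular" signal

# Magnitude-squared coherence, estimated with Welch's method.
f, cxy = coherence(eeg, emg, fs=fs, nperseg=1024)
beta_cmc = cxy[(f >= 12) & (f <= 40)].max()   # peak beta-band EEG-EMG coherence
baseline = cxy[(f >= 60) & (f <= 80)].mean()  # coherence where nothing is shared
```

    Coherence near 1 appears only where the two signals share a component; elsewhere it stays near the estimator's bias floor (roughly the inverse of the number of Welch segments). Tracking how such coherence waxes and wanes over the stimulus cycle would additionally require a time-resolved estimate.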

    Neural tracking of visual periodic motion

    Periodicity is a fundamental property of biological systems, including human movement systems. Periodic movements support displacements of the body in the environment as well as interactions and communication between individuals. Here, we use electroencephalography (EEG) to investigate the neural tracking of visual periodic motion, and more specifically, the relevance of spatiotemporal information contained at and between their turning points. We compared EEG responses to visual sinusoidal oscillations versus nonlinear Rayleigh oscillations, which are both typical of human movements. These oscillations contain the same spatiotemporal information at their turning points but differ between turning points, with Rayleigh oscillations having an earlier peak velocity, shown to increase an individual's capacity to produce accurately synchronized movements. EEG analyses highlighted the relevance of spatiotemporal information between the turning points by showing that the brain precisely tracks subtle differences in velocity profiles, as indicated by earlier EEG responses for Rayleigh oscillations. The results suggest that the brain is particularly responsive to velocity peaks in visual periodic motion, supporting their role in conveying behaviorally relevant timing information at a neurophysiological level. The results also suggest key functions of neural oscillations in the alpha and beta frequency bands, particularly in the right hemisphere. Together, these findings provide insights into the neural mechanisms underpinning the processing of visual periodic motion and the critical role of velocity peaks in enabling proficient visuomotor synchronization.

    Exploration of nociceptive cortical processing with steady-state evoked potentials

    The periodic presentation of a sensory stimulus induces, at certain frequencies of stimulation, a sustained electroencephalographic response known as steady-state evoked potentials (SS-EP). SS-EPs are considered to reflect entrainment of cortical sensory networks resonating at the frequency of stimulation. In the present study, we characterize and compare SS-EPs elicited by the selective electrical activation of nociceptive Aδ-fibers and non-nociceptive Aβ-fibers. Nine subjects took part in the experiment. Ten-second trains of nociceptive (intra-epidermal electrical stimulation) and non-nociceptive (transcutaneous electrical stimulation) stimuli were applied to the left and right hand in separate blocks. Trains consisted of 0.5 ms constant-current pulses modulated at 3, 7, 13, 23 and 43 Hz. Consistent nociceptive and non-nociceptive SS-EPs were recorded at all stimulation frequencies. Whereas non-nociceptive SS-EPs were maximal over the parietal region contralateral to the stimulated side, nociceptive SS-EPs were maximal at the vertex and symmetrically distributed over both hemispheres, thus indicating that the two responses reflect the entrainment of distinct neuronal populations. The recording of nociceptive and non-nociceptive somatosensory SS-EPs offers a unique opportunity to study the cortical representation of nociception and touch in humans.
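    A common way to quantify an SS-EP of this kind is the signal-to-noise ratio of the EEG amplitude spectrum at the stimulation frequency relative to neighbouring frequency bins. The sketch below is a generic illustration with invented recording parameters (not the study's pipeline); the helper function, bin counts, and synthetic 3 Hz response are all assumptions for the example, and NumPy is assumed available.

```python
import numpy as np

def ssep_snr(signal, fs, f0, n_neighbors=10, skip=1):
    """SNR at f0: amplitude at the target FFT bin divided by the mean
    amplitude of n_neighbors bins on each side, skipping the bins
    immediately adjacent to the target (possible spectral leakage)."""
    n = len(signal)
    amp = 2 * np.abs(np.fft.rfft(signal)) / n
    freqs = np.fft.rfftfreq(n, d=1 / fs)
    k = int(np.argmin(np.abs(freqs - f0)))
    neighbors = np.concatenate([amp[k - skip - n_neighbors : k - skip],
                                amp[k + skip + 1 : k + skip + 1 + n_neighbors]])
    return amp[k] / neighbors.mean()

fs, dur = 256, 20                     # illustrative recording parameters
t = np.arange(0, dur, 1 / fs)
rng = np.random.default_rng(2)
# Synthetic response entrained at a 3 Hz stimulation frequency, plus noise.
sig = np.sin(2 * np.pi * 3 * t) + rng.normal(0, 1, t.size)

snr_3hz = ssep_snr(sig, fs, 3)        # large: a steady-state response is present
snr_11hz = ssep_snr(sig, fs, 11)      # near 1: no response at this frequency
```

    An SNR near 1 indicates no response beyond the noise floor, which is why comparing the stimulation-frequency bin against its neighbours, rather than its raw amplitude, is the usual criterion for a significant SS-EP.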

    Exploring how musical rhythm entrains brain activity with electroencephalogram frequency-tagging

    The ability to perceive a regular beat in music and synchronize to this beat is a widespread human skill. Fundamental to musical behaviour, beat and meter refer to the perception of periodicities while listening to musical rhythms and often involve spontaneous entrainment to move on these periodicities. Here, we present a novel experimental method inspired by the frequency-tagging approach to understand the perception and production of rhythmic inputs. This approach is illustrated by recording human electroencephalogram (EEG) responses at beat and meter frequencies elicited in various contexts: mental imagery of meter, spontaneous induction of a beat from rhythmic patterns, multisensory integration and sensorimotor synchronization. Collectively, our observations support the view that entrainment and resonance phenomena subtend the processing of musical rhythms in the human brain. More generally, they highlight the potential of this approach to help us understand the link between the phenomenology of musical beat and meter and the bias towards periodicities arising under certain circumstances in the nervous system. Entrainment to music provides a highly valuable framework to explore general entrainment mechanisms as embodied in the human brain.